
The process mining journey of this leading financial institution started with a simple question from the executive management team: What percentage of our transactions goes straight through without any manual work? On the surface, this question appears easy to answer, but its simplicity was deceptive. Here's why.

Over its 100 years in operation, the institution has deployed hundreds of different internal systems, creating an extremely complex operational environment and a very high volume of enterprise data.

To answer this seemingly simple question, they had to dive into the complexity of all of those systems and find a way to aggregate the data sources so they could study the execution of their transactions end to end.

Creating an end-to-end view of transactions, without coding

First, they worked internally with IT to consolidate data from 40 different systems spanning various domains within the company, such as channel experience, workflow systems, record-keeping systems, and the technology processes that bridge the gaps between those systems.

With the enterprise data from these different systems, they built out a live data stream capturing two million event records per day. With this view, they were finally able to gain a truly end-to-end perspective of their transactions, enabling them to see what portion of transactions goes straight through and what portion requires some type of manual intervention.

While the primary goal of the project was to enable them to benchmark their straight-through processing rates across the industry, the secondary goal was to help them identify their best opportunities for automation, i.e., the ones with the greatest manual handling. Tackling the challenge of stringing together all of these events into a single case view enabled them to achieve both, with surprising results.
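To make the underlying idea concrete, here is a minimal sketch (not the institution's actual tooling) of how a straight-through processing rate could be computed once event records from many systems share a common transaction identifier. The column names and the "is_manual" flag are illustrative assumptions only.

```python
import pandas as pd

# Hypothetical consolidated event log: one row per event, keyed by a shared
# transaction ID; "is_manual" marks events that required human intervention.
events = pd.DataFrame({
    "transaction_id": ["T1", "T1", "T2", "T2", "T2", "T3", "T3"],
    "event_type": ["received", "settled",
                   "received", "manual_review", "settled",
                   "received", "settled"],
    "timestamp": pd.to_datetime([
        "2023-01-02 09:00", "2023-01-02 09:05",
        "2023-01-02 10:00", "2023-01-02 13:30", "2023-01-03 11:00",
        "2023-01-02 11:00", "2023-01-02 11:02",
    ]),
    "is_manual": [False, False, False, True, False, False, False],
})

# Stitch events into per-transaction timelines by ordering them within each case.
timelines = events.sort_values(["transaction_id", "timestamp"])

# A transaction is "straight through" if none of its events needed manual handling.
stp_flags = ~timelines.groupby("transaction_id")["is_manual"].any()
print(f"Straight-through processing rate: {stp_flags.mean():.0%}")
```

In practice, this grouping-by-transaction step is what turns isolated records from dozens of systems into a single end-to-end case view.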

Using process mining to assess automation return on investment (ROI)

First, they investigated their distribution expenses, which were driving $17 million in transaction costs per year from market exposure. One example of this type of expense is called TFE, which refers to expenses incurred (due to market movement) when a transaction isn't executed for a customer on the day it's intended to be executed. They knew that TFEs were driving $6 million of their $17 million in distribution-related expenses, representing a valuable opportunity for cost savings.

With process mining, they were able to look deeply into various distribution processes to discover which customer paths create the highest transaction costs. Discovering this information is critical to identifying opportunities for cost savings, as 95 percent of their transaction costs are driven by the transaction paths that require manual handling.
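As a rough illustration of that kind of analysis (illustrative data and column names, not the institution's), the sketch below derives each transaction's path as the ordered sequence of its events and totals the cost attributed to each path, which is how the most expensive, manual-heavy variants surface.

```python
import pandas as pd

# Hypothetical event log plus a per-transaction cost; names are illustrative.
events = pd.DataFrame({
    "transaction_id": ["T1", "T1", "T2", "T2", "T2", "T3", "T3"],
    "event_type": ["received", "settled",
                   "received", "manual_review", "settled",
                   "received", "settled"],
    "timestamp": pd.to_datetime([
        "2023-01-02 09:00", "2023-01-02 09:05",
        "2023-01-02 10:00", "2023-01-02 13:30", "2023-01-03 11:00",
        "2023-01-02 11:00", "2023-01-02 11:02",
    ]),
})
costs = pd.DataFrame({"transaction_id": ["T1", "T2", "T3"],
                      "cost": [1.0, 250.0, 1.5]})

# Each transaction's path = the ordered sequence of its event types.
paths = (events.sort_values("timestamp")
               .groupby("transaction_id")["event_type"]
               .agg(" -> ".join)
               .rename("path")
               .reset_index())

# Total cost per path, highest first: manual-handling variants float to the top.
cost_by_path = (costs.merge(paths, on="transaction_id")
                     .groupby("path")["cost"]
                     .sum()
                     .sort_values(ascending=False))
print(cost_by_path)
```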

Previously, research analysts had to look into one system after another to access process data; now they could look at all of the events in a transaction timeline as a whole. Furthermore, they could dig into the transaction timelines to see what was actually happening and why it led to an expense.

By investigating timelines of their suspended transactions, they were able to identify a problematic source of unnecessary expenses: once suspended transactions were unsuspended (because all necessary data had been received), they were not being processed immediately. The transactions that were delayed by at least eight hours between being unsuspended and being processed were incurring $3.6 million in expenses. With this data-based knowledge, they initiated a business case to improve the distribution process so that newly unsuspended transactions would be handled immediately, thus eliminating this unnecessary expense.
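The core of that analysis is simply measuring the lag between two events on each transaction's timeline. A minimal sketch of the idea, with hypothetical timestamps and an illustrative expense attribution:

```python
import pandas as pd

# Hypothetical records for suspended transactions: when each was unsuspended,
# when it was actually processed, and the expense attributed to it.
txns = pd.DataFrame({
    "transaction_id": ["T10", "T11", "T12"],
    "unsuspended_at": pd.to_datetime(["2023-03-01 09:00",
                                      "2023-03-01 10:00",
                                      "2023-03-02 08:00"]),
    "processed_at":   pd.to_datetime(["2023-03-01 09:30",
                                      "2023-03-02 07:00",
                                      "2023-03-02 08:10"]),
    "expense":        [0.0, 420.0, 0.0],
})

# Lag between being unsuspended and being processed, in hours.
txns["lag_hours"] = (txns["processed_at"]
                     - txns["unsuspended_at"]).dt.total_seconds() / 3600

# Flag transactions delayed by at least eight hours and total their expense.
delayed = txns[txns["lag_hours"] >= 8]
print(delayed[["transaction_id", "lag_hours", "expense"]])
print("Expense attributable to delayed processing:", delayed["expense"].sum())
```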


Demonstrating ROI of process improvement

Another subset of transactions they investigated was high-value cases of over $5,000, which represented $11 million of their $17 million in transaction expenses (but less than 1% of transaction volume). They wanted to understand how these transactions performed compared to lower-value transactions. With process mining, they identified that one particular type of transaction was occurring three times more often in the high-value cases than in the lower-value cases. The impact of this was $2.4 million in expenses, due to the length of the process and the market exposure it created. With this knowledge, they were able to create a business case to improve the design of this process to handle that type of request on the same day, thereby eliminating the market exposure and the related expense.
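A simple way to picture this comparison (again with made-up data and column names, not the institution's) is to segment transactions at the $5,000 threshold and compare how often the transaction type in question occurs in each segment:

```python
import pandas as pd

# Hypothetical transaction-level data: value, whether the transaction is of the
# type in question, and its expense.
txns = pd.DataFrame({
    "value":   [12000, 800, 6500, 300, 9000, 1500, 7000, 400],
    "is_type": [True, True, True, False, True, False, False, False],
    "expense": [900.0, 1.0, 650.0, 0.5, 820.0, 2.0, 15.0, 0.8],
})

txns["segment"] = txns["value"].apply(
    lambda v: "high (> $5,000)" if v > 5000 else "low")

# Frequency of the transaction type and total expense in each value segment.
summary = txns.groupby("segment").agg(
    type_rate=("is_type", "mean"),
    total_expense=("expense", "sum"),
)
print(summary)
print("High/low frequency ratio:",
      summary.loc["high (> $5,000)", "type_rate"]
      / summary.loc["low", "type_rate"])
```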

Improving customer experience with process mining

The institution also used process mining as a way to improve processes related to customer service by identifying the key points in the process that were causing the most friction for customers. Using an interactive Schema View of the process, they were able to discern a particular path to customer frustration and determine the actions that caused it. Armed with this knowledge, they created a business case to flag those actions early in the process and alert the customer, so that the issue does not reach the point of a complaint being filed.

Process mining enabled this Fortune 100 institution to move from qualitative analysis to quantitative analysis, delivering not only an answer to their initial question of “What percentage of our transactions goes straight through?” but also an opportunity to save $6 million through data-driven process improvements.


Bruce Orcutt

Chief Marketing Officer at ABBYY

Bruce Orcutt is a passionate marketing and leadership executive focused on driving market growth and awareness in digital transformation and intelligent automation. As Chief Marketing Officer at ABBYY, Orcutt leads the global product strategy, go-to-market, launch, pricing, competitive analysis, analyst relations, communications and thought leadership, win/loss analysis, and sales enablement for the entire ABBYY intelligent automation portfolio. His goal is to help global enterprises separate the hype from the reality of the latest artificial intelligence (AI) technologies and ensure they leverage purpose-built AI to achieve their transformation goals.

Orcutt was previously SVP of Product Marketing, where he led the product strategy that transformed ABBYY from a vendor of OCR technology to ISVs into the leading provider of intelligent document processing (IDP) for global enterprises. He helped conceive, develop, and launch two new SaaS platforms for IDP and process mining, with vertical focus areas in financial services, insurance, healthcare, government, and transportation and logistics. Additionally, Orcutt helped migrate the company and its portfolio to subscription and recurring revenue models.

He is a thought leader with market domain expertise in AI, machine learning, process and task mining, document and data capture, digital transformation, intelligent document processing, and hyperautomation. He also has deep experience with mobile platforms and customer experience, and a passion for customer excellence.

Connect with Bruce on LinkedIn.
